Fast computation of sums of Gaussians
Authors
Abstract
Evaluating sums of multivariate Gaussian kernels is a key computational task in many problems in computational statistics and machine learning. The computational cost of the direct evaluation of such sums scales as the product of the number of kernel functions and the number of evaluation points. The fast Gauss transform proposed by Greengard and Strain (1991) is an ε-exact approximation algorithm that reduces the computational complexity of evaluating the sum of N Gaussians at M points in d dimensions from O(MN) to O(M + N). However, the constant factor in O(M + N) grows exponentially with increasing dimensionality d, which makes the algorithm impractical for dimensions greater than three. In this paper we present a new algorithm in which the constant factor is reduced to asymptotically polynomial order. The reduction is based on a new multivariate Taylor series expansion scheme (which can act as both a local and a far-field expansion) combined with an efficient space subdivision using the k-center algorithm. The proposed method differs from the original fast Gauss transform in its factorization, its efficient space subdivision, and its use of different error bounds for each source point. Algorithm details, error bounds, a procedure to choose the parameters, and numerical experiments are presented. We also compare our algorithm with the dual-tree algorithm of Gray and Moore (2003). As an example, we show how the proposed method can be used for very fast ε-exact multivariate kernel density estimation.
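The sum being accelerated has the form G(y_j) = Σ_i q_i exp(−‖y_j − x_i‖² / h²) for N weighted sources x_i and M targets y_j. A minimal NumPy sketch of the direct O(MN) evaluation that the paper's algorithm approximates (function and variable names are illustrative, not from the paper):

```python
import numpy as np

def direct_gauss_sum(sources, targets, weights, h):
    """Direct evaluation of G(y_j) = sum_i q_i * exp(-||y_j - x_i||^2 / h^2).

    sources: (N, d) source points x_i
    targets: (M, d) evaluation points y_j
    weights: (N,) coefficients q_i
    h: Gaussian bandwidth
    Returns an (M,) array; cost is O(M * N) kernel evaluations.
    """
    # Pairwise squared distances via broadcasting, shape (M, N).
    d2 = ((targets[:, None, :] - sources[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / h**2) @ weights
```

An ε-exact method such as the one described above must return values within a user-specified ε of this direct sum, while avoiding the explicit M×N distance matrix.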
Similar resources
Fast computation of sums of Gaussians in high dimensions
Evaluating sums of multivariate Gaussian kernels is a key computational task in many problems in computational statistics and machine learning. The computational cost of the direct evaluation of such sums scales as the product of the number of kernel functions and the number of evaluation points. The fast Gauss transform proposed by Greengard and Strain (1991) is an ε-exact approximation algorithm that re...
Improved Fast Gauss Transform and Efficient Kernel Density Estimation
Evaluating sums of multivariate Gaussians is a common computational task in computer vision and pattern recognition, including in the general and powerful kernel density estimation technique. The quadratic computational complexity of the summation is a significant barrier to the scalability of this algorithm to practical applications. The fast Gauss transform (FGT) has successfully accelerated ...
Implementation of Continuous Bayesian Networks Using Sums of Weighted Gaussians
Bayesian networks provide a method of representing conditional independence between random variables and computing the probability distributions associated with these random variables. In this paper, we extend Bayesian network structures to compute probability density functions for continuous random variables. We make this extension by approximating prior and conditional densities using...
Fast Computation With Two Algebraic Numbers
We propose fast algorithms for computing composed products and composed sums, as well as diamond products, of univariate polynomials. These operations correspond to special resultants, which we compute using power sums of the roots of the polynomials, by means of their generating series.
The bucket box intersection (BBI) algorithm for fast approximative evaluation of diagonal mixture Gaussians
Today, most of the state-of-the-art speech recognizers are based on Hidden Markov modeling. Using semi-continuous or continuous density Hidden Markov Models, the computation of emission probabilities requires the evaluation of mixture Gaussian probability density functions. Since it is very expensive to evaluate all the Gaussians of the mixture density codebook, many recognizers only compute th...
Journal:
Volume, Issue:
Pages: -
Publication date: 2006